05. RViz Integration

RViz

While Gazebo is a physics simulator, RViz can visualize any type of sensor data published over a ROS topic, such as camera images, point clouds, and lidar data. This data can be a live stream coming directly from the sensor or pre-recorded data stored in a bag file. RViz is your one-stop tool for visualizing all three core aspects of a robot: Perception, Decision Making, and Actuation.

In this section, you will integrate your model into RViz and visualize data from the camera and laser sensors!

Modify the robot_description file

We will start with modifying the robot_description.launch file.

$ cd /home/workspace/catkin_ws/src/udacity_bot/launch/
$ nano robot_description.launch

Add the following after the first “param” definition.

  <!-- Send fake joint values -->
  <node name="joint_state_publisher" pkg="joint_state_publisher" type="joint_state_publisher">
    <param name="use_gui" value="false"/>
  </node>

  <!-- Send robot states to tf -->
  <node name="robot_state_publisher" pkg="robot_state_publisher" type="robot_state_publisher" respawn="false" output="screen"/>

In the launch file, you are adding two nodes. The first node, from the joint_state_publisher package, publishes joint state messages for the robot, such as the angles of the non-fixed joints. The second node, from the robot_state_publisher package, uses those joint states and the robot model to publish the robot's state to tf (the transform tree). Your robot model has a frame corresponding to each link/joint, and robot_state_publisher publishes the 3D pose of every one of these links. This is very convenient, especially for more complicated robots (like the PR2).
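Once these nodes are running, you can spot-check their output from another terminal. A quick sketch (the link name "chassis" is an assumption; use whichever link names appear in your URDF):

```shell
$ rostopic echo -n 1 /joint_states     # joint angles from joint_state_publisher
$ rosrun tf tf_echo odom chassis       # pose of a link relative to the odom frame
```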

Modify the udacity_world launch file

Next, you need to launch RViz along with Gazebo.

$ nano udacity_world.launch

Add the following at the end of the file, after the urdf_spawner node definition.

<!--launch rviz-->
<node name="rviz" pkg="rviz" type="rviz" respawn="false"/>

The above adds a node that launches RViz. Let's launch it.

Launch it!

$ cd /home/workspace/catkin_ws/
$ roslaunch udacity_bot udacity_world.launch

This time both Gazebo and RViz should launch. Once they are loaded:

Select the RViz window, and on the left side, under Displays:

  • Under “Global Options”, set the “Fixed Frame” to “odom”
  • Click the “Add” button and
    • add “RobotModel”
    • add “Camera” and select the Image topic that was defined in the camera Gazebo plugin
    • add “LaserScan” and select the topic that was defined in the Hokuyo Gazebo plugin.

Your robot model should load up in RViz.
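Setting up these displays by hand every time is tedious. You can save the configuration in RViz (File → Save Config As) and have the launch file load it automatically. A sketch of the modified rviz node — the rviz/udacity_bot.rviz path and file name are assumptions, so adjust them to wherever you saved your config:

```xml
<!-- Hypothetical: load a saved RViz config instead of starting with defaults -->
<node name="rviz" pkg="rviz" type="rviz" respawn="false"
      args="-d $(find udacity_bot)/rviz/udacity_bot.rviz"/>
```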

In Gazebo, click on “Insert” and, from the list, add any object to the world in front of the robot.

You should be able to see the object in RViz in the “Camera” viewer, as well as the laser scan of that object.

While everything above is still running, open a new terminal window, and enter

$ rostopic pub /cmd_vel geometry_msgs/Twist "linear:
  x: 0.1
  y: 0.0
  z: 0.0
angular:
  x: 0.0
  y: 0.0
  z: 0.1" 

The above command publishes messages to cmd_vel, a topic defined in the drive controller plugin. The values set for linear.x and angular.z will make the robot move in a circle! Try modifying the values to move the robot around.

Note: For the above command, after entering geometry_msgs/Twist, you can press Tab twice to complete the message definition, and then change the required values.
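Why a circle? Under the unicycle model used by differential-drive controllers, a constant linear speed v and angular rate w trace a circle of radius r = v/w. A quick sanity check in plain Python (no ROS needed):

```python
def circle_radius(v, w):
    """Radius of the circle traced by a robot commanded with
    constant linear.x = v (m/s) and angular.z = w (rad/s)."""
    if w == 0.0:
        raise ValueError("zero angular velocity: straight-line motion")
    return v / w

# Values from the rostopic command above: v = 0.1 m/s, w = 0.1 rad/s
print(circle_radius(0.1, 0.1))  # -> 1.0 (metres)
```

So with the values above, the robot should drive a circle of roughly one metre radius; doubling linear.x doubles the radius, while increasing angular.z tightens it.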

While the robot is moving, insert a few more objects in Gazebo in front of it and observe how the sensor outputs change in RViz.

In a previous lab, you incorporated a teleop package to drive the turtlebot around. Why don't you try to integrate the package with this new robot?
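One option is the teleop_twist_keyboard package, which publishes Twist messages from keyboard input. A hedged sketch — the install command assumes a Kinetic install, so substitute your ROS distro name, and the remap is only needed if your robot listens on a topic other than the default cmd_vel:

```shell
$ sudo apt-get install ros-kinetic-teleop-twist-keyboard
$ rosrun teleop_twist_keyboard teleop_twist_keyboard.py cmd_vel:=/cmd_vel
```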